Subgradient Methods

Authors

  • Stephen Boyd
  • Almir Mutapcic
Abstract

3 Convergence proof

  • 3.1 Assumptions
  • 3.2 Some basic inequalities
  • 3.3 A bound on the suboptimality bound
  • 3.4 A stopping criterion
  • 3.5 Numerical example


Similar resources

Stochastic Subgradient Methods

Stochastic subgradient methods play an important role in machine learning. In this project we introduced the concepts of subgradient methods and stochastic subgradient methods, discussed their convergence conditions, and weighed their strengths and weaknesses against competing methods. We demonstrated the application of (stochastic) subgradient methods to machine learning with a running example of tr...
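The abstract above mentions a running machine-learning example but the text is truncated. As a rough illustration of the kind of method it describes, here is a minimal stochastic subgradient sketch for a hinge-loss (SVM-style) objective; the objective, the diminishing step size 1/(λ(k+1)), and the toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def stochastic_subgradient_svm(X, y, lam=0.1, steps=1000, seed=0):
    """Stochastic subgradient method (illustrative sketch) for
    f(w) = (lam/2)*||w||^2 + (1/n) * sum_i max(0, 1 - y_i * <w, x_i>).
    Each step uses a subgradient built from one random training example."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for k in range(steps):
        i = rng.integers(n)                 # sample one training example
        margin = y[i] * (X[i] @ w)
        g = lam * w                         # gradient of the regularizer
        if margin < 1:                      # hinge term is active here
            g -= y[i] * X[i]                # add a subgradient of the hinge
        w -= g / (lam * (k + 1))            # diminishing step size
    return w

# Toy separable data: the label is the sign of the first coordinate.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0])
w = stochastic_subgradient_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

Because individual steps are noisy, the objective is not monotonically decreasing; convergence results for such methods are typically stated in expectation or for the best iterate so far.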


A new Levenberg-Marquardt approach based on Conjugate gradient structure for solving absolute value equations

In this paper, we present a new approach for solving the absolute value equation (AVE) which uses the Levenberg-Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods the new direction is obtained by combining the steepest descent direction and the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent dir...


On the Efficiency of the ε-subgradient Methods over Nonlinearly Constrained Networks

The efficiency of the network flow techniques can be exploited in the solution of nonlinearly constrained network flow problems by means of approximate subgradient methods. In particular, we consider the case where the side constraints (non-network constraints) are convex. We propose to solve the dual problem by using ε-subgradient methods given that the dual function is estimated by minimizing...


Randomized Block Subgradient Methods for Convex Nonsmooth and Stochastic Optimization

Block coordinate descent methods and stochastic subgradient methods have been extensively studied in optimization and machine learning. By combining randomized block sampling with stochastic subgradient methods based on dual averaging ([22, 36]), we present stochastic block dual averaging (SBDA)—a novel class of block subgradient methods for convex nonsmooth and stochastic optimization. SBDA re...


Inexact subgradient methods over nonlinearly constrained networks

The minimization of nonlinearly constrained network flow problems can be performed by using approximate subgradient methods. The idea is to solve this kind of problem by means of primal-dual methods, given that the minimization of nonlinear network flow problems can be done efficiently by exploiting the network structure. In this work it is proposed to solve the dual problem by using ε-subgradie...


Lecture 2: Subgradient Methods

In this lecture, we discuss first-order methods for the minimization of convex functions. We focus almost exclusively on subgradient-based methods, which are essentially universally applicable for convex optimization problems because they rely very little on the structure of the problem being solved. This leads to effective but slow algorithms in classical optimization problems; however, in la...
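As a concrete illustration of the basic subgradient method such lectures cover, here is a minimal sketch that minimizes a piecewise-linear convex function f(x) = max_i (aᵢᵀx + bᵢ), a standard nonsmooth test problem. The random problem instance and the diminishing step size 1/√(k+1) are illustrative assumptions; since the iteration is not a descent method, the best value seen so far is tracked explicitly.

```python
import numpy as np

# Minimize f(x) = max_i (a_i^T x + b_i) with the subgradient method.
# A valid subgradient at x is a_j for any index j attaining the max.
rng = np.random.default_rng(0)
m, d = 20, 5
A = rng.normal(size=(m, d))
b = rng.normal(size=m)

def f(x):
    return np.max(A @ x + b)

x = np.zeros(d)
f_best = f(x)
for k in range(5000):
    j = np.argmax(A @ x + b)        # index of an active piece
    g = A[j]                        # a subgradient of f at x
    x = x - g / np.sqrt(k + 1)      # diminishing (square-summable-like) step
    f_best = min(f_best, f(x))      # f(x_k) need not decrease monotonically
```

With a diminishing, non-summable step size, the best objective value f_best converges to the optimal value, though typically slowly, which matches the "effective but slow" characterization above.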



Journal title:

Volume:   Issue:

Pages:  -

Publication date: 2007